python module per simulator template permutation #1047

nicolasaunai merged 6 commits into PHAREHUB:master from
Conversation
📝 Walkthrough

Introduces a SamraiLifeCycle class and migrates SAMRAI usage to this lifecycle. Refactors Python C++ bindings from per-dimension modules to per-simulator-ID pybind modules, centralizes cpp access in a pyphare.cpp namespace, and updates build/test CMake targets and many tests to the new interfaces.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    participant Py as Python test/script
    participant Pyphare as pyphare.cpp (module loader)
    participant PB as pybind cpp_<sim> module
    participant PH as PHARE SamraiLifeCycle
    participant SAM as SAMRAI
    Py->>Pyphare: import cpp
    Pyphare->>PB: load cpp_<sim> (simulator_id)
    PB->>PH: SamraiLifeCycle() / make_hierarchy()
    PH->>SAM: initialize MPI & SAMRAI manager
    Py->>PB: make_simulator(hierarchy)
    PB->>PH: access getDatabase/getRestartManager
    PH->>SAM: delegate calls
```
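The loading flow in the diagram can be sketched in Python. The `simulator_id` naming scheme and the `pybindlibs.cpp_<id>` module path below are assumptions for illustration, not the exact pyphare API:

```python
import importlib


def simulator_id(dim: int, interp: int, nbr_parts: int) -> str:
    # Hypothetical id scheme: one pybind module is built per template
    # permutation, e.g. dim=2, interp=1, nbr_parts=4 -> "2_1_4".
    return f"{dim}_{interp}_{nbr_parts}"


def cpp_lib(sim_id: str):
    # Import only the extension module matching the requested permutation,
    # instead of one monolithic per-dimension module.
    return importlib.import_module(f"pybindlibs.cpp_{sim_id}")
```

A caller would then do something like `cpp_lib(simulator_id(2, 1, 4)).make_simulator(hier)`, keeping all permutation knowledge behind one loader function.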
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~65 minutes

Possibly related PRs
Suggested labels
Suggested reviewers
🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
ca57af8 to 3d0cf5b
3d0cf5b to bb02c3c
7bb9995 to be7ebda
be7ebda to 717ecf5
c7b284b to 1c8de66
Actionable comments posted: 20
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
src/python3/patch_level.hpp (1)
4-11: Duplicate include detected. `"patch_data.hpp"` on line 4 and `"python3/patch_data.hpp"` on line 11 resolve to the same header file since this file resides in `src/python3/`. One of these includes should be removed.

Proposed fix

```diff
 #include "patch_data.hpp"
 #include "phare_solver.hpp"
 #include <string>
 #include <cstring>
 #include <cstddef>
-#include "python3/patch_data.hpp"
-
```
🤖 Fix all issues with AI agents
In `@res/__tc/clang_samraisub/0_config.sh`:
- Around line 1-6: Add a shebang at the top of the script (e.g., #!/bin/bash or
#!/bin/sh) and move the existing set -ex invocation to immediately after the
shebang so error tracing and command echoing are enabled before any commands run
(currently pwd appears before set -ex); keep the remaining commands (pwd, ulimit
-a, clang -v, ldd --version, ld -v) in their original order.
In `@res/__tc/clang_samraisub/1_build.sh`:
- Around line 1-3: The script lacks an explicit shebang which can cause
ambiguous shell selection; add a Bash shebang (e.g. #!/usr/bin/env bash) as the
first line before the existing "set -ex" so the script always runs under Bash
and preserves the expected behavior of "set -ex" and the N_CORES check ([ -z
"$N_CORES" ] && ...). Ensure the shebang precedes any other text in the file.
In `@res/__tc/clang_samraisub/2_test.sh`:
- Around line 1-18: Add a portable shebang line (e.g. #!/usr/bin/env bash) as
the very first line of the script before the existing "set -ex" so the
interpreter is explicit and consistent across environments; keep the rest of the
file (environment exports and the subshell block starting with "(" and the "cd
build" / "ctest -j$N_CORES --output-on-failure" commands) unchanged.
In `@res/__tc/clang/0_config.sh`:
- Around line 1-2: Add a shebang as the first line of the script so it always
runs under the intended shell (e.g., use /usr/bin/env bash); ensure the shebang
precedes the existing commands like the initial "pwd" and the "set -ex" line so
the script is executed by bash.
In `@res/__tc/clang/1_build.sh`:
- Line 1: Add a bash shebang at the top of the script so the bash-specific
setting "set -ex" runs under a consistent shell; update the file to start with a
line like "#!/usr/bin/env bash" (or "#!/bin/bash") before the existing "set -ex"
so the script is executed with bash regardless of how it's invoked.
In `@res/__tc/clang/2_test.sh`:
- Line 1: The test script res/__tc/clang/2_test.sh is missing a shebang so its
shell behavior can vary; add a proper shebang as the very first line (for
example using /usr/bin/env bash) so that the script runs with a consistent shell
when executing the commands like set -ex, and ensure the shebang precedes any
other content in the file.
In `@res/__tc/clang/3_test_mpi.sh`:
- Around line 1-2: The script is missing a shebang and defines GCC_ASAN_PRELOAD
but never uses it; add a proper shebang (e.g., #!/usr/bin/env bash) at the top
of the file to ensure portability, and either remove the unused GCC_ASAN_PRELOAD
assignment or replace the reference to CLANG_ASAN_PRELOAD with GCC_ASAN_PRELOAD
where appropriate so the intended preload variable is actually used (search for
GCC_ASAN_PRELOAD and CLANG_ASAN_PRELOAD in the script to decide which variable
should be kept/used).
In `@res/__tc/gcc_samraisub/0_config.sh`:
- Around line 1-2: The script currently starts with commands "pwd" and "set -ex"
but lacks a shebang, triggering shellcheck SC2148; add a proper shebang line
(for example "#!/usr/bin/env bash") as the first line of the file so the script
declares its target shell and enables shell-specific linting for the existing
"set -ex" usage.
In `@res/__tc/gcc_samraisub/2_test.sh`:
- Around line 1-2: Add an explicit shebang at the top of the script to ensure
portable execution when run directly: modify the file containing the line "set
-ex" (script 2_test.sh) to insert the appropriate shell shebang (e.g.,
#!/usr/bin/env bash) as the very first line so the interpreter is defined before
the existing set -ex invocation.
In `@res/__tc/gcc/0_config.sh`:
- Around line 1-2: Add a shebang line at the top of the script so the shell
interpreter is explicit; insert e.g. "#!/usr/bin/env bash" as the first line
above the existing commands (the block containing "pwd" and "set -ex") so the
script runs with the intended shell interpreter.
In `@res/__tc/gcc/1_build.sh`:
- Around line 1-2: The script currently only contains "set -ex" which leaves the
interpreter ambiguous; add a shebang as the first line (for example
"#!/usr/bin/env bash") above the existing set -ex so the shell interpreter is
explicit and the script can be executed directly.
In `@res/__tc/gcc/2_test.sh`:
- Around line 1-3: The script lacks a shebang, causing inconsistent shell
selection; add a shebang as the very first line (for example "#!/usr/bin/env
bash") so the script runs with a known shell, leaving the existing "set -ex" and
the N_CORES check ([ -z "$N_CORES" ] && echo ... && exit 1) unchanged and
functioning as before.
In `@res/__tc/gcc/3_test_mpi.sh`:
- Line 1: Add a shebang as the first line to explicitly set the shell
interpreter (the script currently starts with the line "set -ex"); prepend
something like "#!/usr/bin/env bash" above "set -ex" so the script always runs
under bash and avoids relying on the invoking shell.
In `@res/__tc/nightly/0_config.sh`:
- Around line 1-6: Add a shebang (e.g., #!/usr/bin/env bash) at the top of the
script and move the set -ex line to immediately after the shebang so
tracing/debug flags apply to all commands; ensure the rest of the commands (pwd,
ulimit -a, gcc -v, ldd --version, ld -v) remain in their original order
following set -ex so they run with -x and -e enabled.
In `@res/__tc/nightly/1_build.sh`:
- Around line 1-2: Add an explicit shebang as the very first line of the script
so the interpreter is unambiguous; place a bash shebang (for example, using
/usr/bin/env bash) immediately above the existing set -ex line in the script to
ensure it runs with the intended shell.
In `@res/__tc/nightly/2_test.sh`:
- Line 1: The script currently starts with "set -ex" but lacks a shebang; add a
shebang as the first line (for example "#!/usr/bin/env bash" or the same
interpreter used in 3_test_mpi.sh) so the shell interpreter is explicit, keeping
the existing "set -ex" line immediately after the shebang; ensure the shebang is
the very first line of the 2_test.sh file.
- Line 6: The MODULEPATH export line contains a duplicate entry
'/etc/scl/modulefiles'; update the export of MODULEPATH (the "export
MODULEPATH=...") to remove the repeated '/etc/scl/modulefiles' so each path
appears only once (i.e., ensure the colon-separated list in MODULEPATH is
de-duplicated), or replace the literal assignment with a small dedup step that
builds MODULEPATH uniquely before exporting.
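The de-duplication the comment asks for is an order-preserving filter over a colon-separated list. A minimal Python sketch of the logic (the shell scripts themselves would use an equivalent one-liner):

```python
def dedup_path(path: str, sep: str = ":") -> str:
    """Remove repeated entries from a PATH-like string, keeping first occurrences."""
    seen = set()
    parts = []
    for entry in path.split(sep):
        if entry and entry not in seen:
            seen.add(entry)
            parts.append(entry)
    return sep.join(parts)
```

For example, `dedup_path("/etc/scl/modulefiles:/usr/share/modulefiles:/etc/scl/modulefiles")` keeps only the first `/etc/scl/modulefiles`.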
In `@res/__tc/nightly/3_test_mpi.sh`:
- Line 6: The MODULEPATH environment variable export contains a duplicate entry
"/etc/scl/modulefiles"; update the export statement that sets MODULEPATH so each
path appears only once (remove the redundant "/etc/scl/modulefiles") to avoid
duplicated paths when MODULEPATH is used.
- Line 1: Add an explicit bash shebang as the very first line of the script (use
env to locate bash) so the system knows which shell to run, placing it before
the existing "set -ex" line; ensure the shebang is the file's first line so the
interpreter is selected when the script is executed directly.
In `@src/python3/data_wrangler.hpp`:
- Around line 76-87: The DataWrangler constructors currently dereference the
passed shared_ptr and store a reference (simulator_) which can dangle and lacks
null checks; update the class to hold a std::shared_ptr<ISimulator> (or
std::shared_ptr<Simulator> as appropriate) as a member (e.g., simulator_ptr_)
and in both constructors validate the incoming shared_ptr (throw or return/error
on null) before assigning simulator_ptr_ = simulator (or =
cast_simulator(simulator)); change uses of simulator_ to use
simulator_ptr_.get() or dereference when needed so ownership is retained and
null is guarded against.
♻️ Duplicate comments (2)
tests/core/data/field/test_field.hpp (1)
7-10: Add missing standard headers for used symbols. The file uses `std::isnan`, `std::ostringstream`, `std::runtime_error`, `std::string`, and `std::vector` without direct includes; relying on transitive includes is fragile and can break builds.

♻️ Proposed fix

```diff
 #include "gtest/gtest.h" // EXPECT_FLOAT_EQ
 #include <cassert>
+#include <cmath>
+#include <sstream>
+#include <stdexcept>
+#include <string>
+#include <vector>
```

src/python3/cpp_simulator.hpp (1)
118-121: Duplicate: makeSimulator should be called without a template parameter.
This was already flagged in prior review comments for this file.
🧹 Nitpick comments (7)
res/__tc/gcc/3_test_mpi.sh (1)
21-22: Quote variables to prevent word splitting issues. While `N_CORES` is validated to be non-empty, quoting the variable is a shell best practice to prevent potential issues with whitespace or special characters.

Proposed fix

```diff
-ninja -j$N_CORES -v
-ctest -j$N_CORES --output-on-failure
+ninja -j"$N_CORES" -v
+ctest -j"$N_CORES" --output-on-failure
```

pyphare/pyphare/data/wrangler.py (1)
6-14: Replace `getattr` with direct attribute access. Using `getattr` with a constant attribute name ("DataWrangler") provides no benefit over direct attribute access and reduces readability.

Proposed fix

```diff
-        self.cpp = getattr(cpp.cpp_lib(sim), "DataWrangler")(
-            simulator.cpp_sim, simulator.cpp_hier
-        )
+        self.cpp = cpp.cpp_lib(sim).DataWrangler(
+            simulator.cpp_sim, simulator.cpp_hier
+        )
```

src/amr/samrai.hpp (2)
5-30: Prefer explicit Logger include + non-null StreamAppender handle.
`StreamAppender` inherits from `SAMRAI::tbox::Logger::Appender`, but the header doesn't include the Logger definition and accepts a nullable pointer. Consider a direct include and a reference to prevent null deref; also add `override` for safer polymorphism.

♻️ Proposed change (header)

```diff
+#include <SAMRAI/tbox/Logger.h>
 ...
 class StreamAppender : public SAMRAI::tbox::Logger::Appender
 {
 public:
-    StreamAppender(std::ostream* stream) { d_stream = stream; }
-    void logMessage(std::string const& message, std::string const& filename, int const line)
+    explicit StreamAppender(std::ostream& stream) : d_stream(stream) {}
+    void logMessage(std::string const& message, std::string const& filename, int const line) override
     {
-        (*d_stream) << "At :" << filename << " line :" << line << " message: " << message
-                    << std::endl;
+        d_stream << "At :" << filename << " line :" << line << " message: " << message
+                 << std::endl;
     }
 private:
-    std::ostream* d_stream;
+    std::ostream& d_stream;
 };
```

If you adopt this, update construction to pass `std::cout` by reference.
33-46: Prevent accidental copies of `SamraiLifeCycle`. Copying this type would risk multiple shutdown/finalize calls on SAMRAI/MPI. Consider deleting copy/move to enforce singleton-like semantics. Based on learnings, SamraiLifeCycle is intended to live for the program lifetime.

♻️ Proposed change

```diff
 class SamraiLifeCycle
 {
 public:
     SamraiLifeCycle(int argc = 0, char** argv = nullptr);
     ~SamraiLifeCycle();
+
+    SamraiLifeCycle(SamraiLifeCycle const&) = delete;
+    SamraiLifeCycle& operator=(SamraiLifeCycle const&) = delete;
+    SamraiLifeCycle(SamraiLifeCycle&&) = delete;
+    SamraiLifeCycle& operator=(SamraiLifeCycle&&) = delete;
```

res/__tc/clang/3_test_mpi.sh (1)
21-23: Drop the unused GCC_ASAN_PRELOAD (or wire it in).
Currently only CLANG_ASAN_PRELOAD is used.

♻️ Proposed cleanup

```diff
-GCC_ASAN_PRELOAD=$(gcc -print-file-name=libasan.so)
 CLANG_ASAN_PRELOAD=$(clang -print-file-name=libclang_rt.asan.so)
```

res/__tc/clang/2_test.sh (1)
21-22: `GCC_ASAN_PRELOAD` is unused. The variable is defined but only `CLANG_ASAN_PRELOAD` is used in the test execution. Consider removing it or adding a comment explaining why it's kept (e.g., for reference or future GCC testing).

Proposed fix (remove unused variable)

```diff
 export ASAN_SYMBOLIZER_PATH=/usr/bin/llvm-symbolizer
 export ASAN_OPTIONS=detect_leaks=0
-GCC_ASAN_PRELOAD=$(gcc -print-file-name=libasan.so)
 CLANG_ASAN_PRELOAD=$(clang -print-file-name=libclang_rt.asan.so)
```

pyphare/pyphare/cpp/__init__.py (1)
39-56: Replace `getattr` with direct attribute access. Using `getattr` with constant attribute names provides no safety benefit over direct attribute access and reduces code clarity.

Proposed fix

```diff
 def splitter_type(sim):
-    return getattr(cpp_lib(sim), "Splitter")
+    return cpp_lib(sim).Splitter

 def split_pyarrays_fn(sim):
-    return getattr(cpp_lib(sim), "split_pyarray_particles")
+    return cpp_lib(sim).split_pyarray_particles

 def mpi_rank():
-    return getattr(cpp_etc_lib(), "mpi_rank")()
+    return cpp_etc_lib().mpi_rank()

 def mpi_size():
-    return getattr(cpp_etc_lib(), "mpi_size")()
+    return cpp_etc_lib().mpi_size()

 def mpi_barrier():
-    return getattr(cpp_etc_lib(), "mpi_barrier")()
+    return cpp_etc_lib().mpi_barrier()
```
45d0232 to 6356a75
Actionable comments posted: 3
🤖 Fix all issues with AI agents
In `@ISSUES.TXT`:
- Around line 33-34: Update the typo in the documentation line that currently
reads "shared libaries," to "shared libraries,"; locate the sentence referencing
the CMake flag `-DSAMRAI_ROOT=/path` and replace the misspelled token "libaries"
with the correct spelling "libraries" so the sentence reads "By default, PHARE
will build SAMRAI in place with shared libraries,".
In `@res/cmake/dep/samrai.cmake`:
- Around line 27-29: The option call for BUILD_SHARED_LIBS doesn't override a
cached value, so replace the option/conditional behavior in the
SAMRAI_BUILD_SHARED_LIBS branch by forcing the cache variable: set
BUILD_SHARED_LIBS to ON in the CMake cache (e.g. use set(BUILD_SHARED_LIBS ON
CACHE BOOL "Make shared libs" FORCE)) when SAMRAI_BUILD_SHARED_LIBS is true so
the shared-libs flag actually takes effect; update the logic around
SAMRAI_BUILD_SHARED_LIBS to use this cache-set approach instead of option().
In `@src/amr/wrappers/hierarchy.hpp`:
- Around line 47-63: The constructor HierarchyRestarter registers patch data IDs
via pdrm->registerPatchDataForRestart for each entry in
sim_dict["simulation"]["restarts"]["restart_ids"] before calling
SamraiLifeCycle::getRestartManager()->openRestartFile; if openRestartFile throws
those registrations will leak. Modify the constructor to perform the
registrations inside a try block (or use a simple RAII/scope-guard object) and
if openRestartFile throws catch the exception, unregister the same IDs via
pdrm->unregisterPatchDataForRestart (or the appropriate unregister API) and
rethrow; also add a short inline comment near restart_ids explaining what those
IDs represent and why they must be registered prior to opening the restart file.
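The exception-safety shape described above (register the ids, attempt to open the restart file, roll back the registrations on failure, rethrow) is language-agnostic. A minimal Python sketch, with a hypothetical manager API standing in for SAMRAI's patch data restart manager:

```python
class RestartRegistration:
    """Register ids up front; if opening fails, unregister them and rethrow."""

    def __init__(self, manager, restart_ids):
        self.manager = manager
        self.ids = list(restart_ids)

    def open(self, open_restart_file):
        for i in self.ids:
            self.manager.register_id(i)  # hypothetical API
        try:
            open_restart_file()
        except Exception:
            # roll back the registrations so nothing leaks, then rethrow
            for i in self.ids:
                self.manager.unregister_id(i)  # hypothetical API
            raise
```

The C++ constructor would implement the same pattern with a try block or a small scope guard whose rollback is disarmed once `openRestartFile` succeeds.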
♻️ Duplicate comments (1)
src/amr/wrappers/hierarchy.hpp (1)
65-75: Add a comment explaining why patch data IDs are unregistered. Per the previous review feedback, the purpose of unregistering these IDs in the destructor is unclear to readers. A brief comment explaining that this cleanup is necessary to prevent stale registrations when the hierarchy is destroyed (especially relevant for supporting multiple sequential simulations via `SamraiLifeCycle::reset()`) would improve maintainability.
7914e1b to 26d4666
Actionable comments posted: 3
🤖 Fix all issues with AI agents
In `@res/cmake/dep/samrai.cmake`:
- Around line 7-38: When SAMRAI_ROOT is provided the code must validate that the
discovered SAMRAI installation is built as shared libs; after the existing
find_package(SAMRAI ...) call add a guard that inspects the SAMRAI build-type
variable (e.g. SAMRAI_BUILD_SHARED_LIBS or the equivalent exported variable set
by SAMRAI's config) and if it indicates static libraries, abort with a clear
message (message(FATAL_ERROR ...)) explaining static SAMRAI is unsupported
(reference ISSUES.TXT) and instructing the user to provide a shared build or
unset SAMRAI_ROOT; place this check in the same branch that handles SAMRAI_ROOT
so the add_subdirectory/static-enforcement logic in the else branch is
symmetric.
In `@src/python3/patch_level.hpp`:
- Around line 11-12: Remove the duplicate include of the same header in
src/python3/patch_level.hpp: keep a single include of patch_data.hpp and delete
the extra line that includes "python3/patch_data.hpp" so only one `#include` for
patch_data.hpp remains (locate the duplicated includes near the top of the file
where "patch_data.hpp" and "python3/patch_data.hpp" are present).
In `@tests/functional/shock/shock.py`:
- Around line 4-15: Move the ph.NO_GUI() call (and the import of ph) so it
occurs before importing matplotlib.pyplot: import pyphare.pharein as ph and call
ph.NO_GUI() prior to the line "import matplotlib.pyplot as plt" so matplotlib's
backend is set to "Agg" before pyplot locks the backend; update any uses of
ph.NO_GUI(), the ph import, and the "import matplotlib.pyplot as plt" ordering
accordingly.
♻️ Duplicate comments (1)
src/python3/cpp_simulator.hpp (1)
118-121: Fix invalid template call to makeSimulator.
`makeSimulator` is not templated (see `src/simulator/simulator.hpp`), so `makeSimulator<Sim>(hier)` will not compile. Use the non-templated call.

🐛 Proposed fix

```diff
-    m.def(name.c_str(), [](std::shared_ptr<PHARE::amr::Hierarchy> const& hier) {
-        return makeSimulator<Sim>(hier);
-    });
+    m.def(name.c_str(), [](std::shared_ptr<PHARE::amr::Hierarchy> const& hier) {
+        return makeSimulator(hier);
+    });
```
🧹 Nitpick comments (4)
src/core/utilities/meta/meta_utilities.hpp (1)
116-132: Consider adding a brief doc comment to clarify purpose. The function provides sensible defaults and follows the existing pattern. A short documentation comment explaining the distinction from `possibleSimulators()` (which provides all valid combinations vs. these defaults) would help future maintainers understand when to use each.

📝 Suggested documentation

```diff
+    /** \brief Returns the default simulator configurations used by PHARE executables.
+     *
+     * Unlike possibleSimulators() which lists all valid combinations,
+     * this provides a single default particle count per dim/interp pair.
+     */
     constexpr decltype(auto) phare_exe_default_simulators()
     {
```

src/amr/CMakeLists.txt (1)
86-88: Clarify the "static SAMRAI" comment for a SHARED target. The code now builds a shared library, so the comment is slightly misleading.

♻️ Suggested comment update

```diff
-# single source of entry for static SAMRAI things
+# single source of entry for SAMRAI lifecycle integration
```

src/amr/samrai.hpp (1)
19-30: Prefer a non-null stream handle and add `override`. Storing a raw pointer allows a null dereference in `logMessage`. A reference (or `not_null`) plus `override` makes intent explicit.

♻️ Suggested refactor

```diff
 class StreamAppender : public SAMRAI::tbox::Logger::Appender
 {
 public:
-    StreamAppender(std::ostream* stream) { d_stream = stream; }
-    void logMessage(std::string const& message, std::string const& filename, int const line)
+    explicit StreamAppender(std::ostream& stream) : d_stream(stream) {}
+    void logMessage(std::string const& message, std::string const& filename, int const line) override
     {
-        (*d_stream) << "At :" << filename << " line :" << line << " message: " << message
-                    << std::endl;
+        d_stream << "At :" << filename << " line :" << line << " message: " << message
+                 << std::endl;
     }
 private:
-    std::ostream* d_stream;
+    std::ostream& d_stream;
 };
```

Remember to adjust the call site in `src/amr/samrai.cpp` accordingly.

src/python3/CMakeLists.txt (1)
5-9: Consider validating the permutations file exists. If `PHARE_PERMUTATIONS` points to a non-existent file, `file(STRINGS ...)` will silently produce an empty list, resulting in no `cpp_*` modules being built. This could lead to confusing runtime errors.

♻️ Suggested validation

```diff
 if(NOT PHARE_PERMUTATIONS)
   set(PHARE_PERMUTATIONS "${PHARE_PROJECT_DIR}/res/sim/all.txt")
 endif()
+
+if(NOT EXISTS ${PHARE_PERMUTATIONS})
+  message(FATAL_ERROR "PHARE_PERMUTATIONS file not found: ${PHARE_PERMUTATIONS}")
+endif()

 file(STRINGS ${PHARE_PERMUTATIONS} THIS_FILE)
```
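To make the silent-failure mode concrete, here is a Python sketch of the kind of per-line expansion the CMake loop performs over the permutations file; the whitespace-separated `dim interp nbrParts` line format and the `cpp_<...>` target naming are assumptions for illustration:

```python
def permutation_targets(text: str):
    """Expand a permutations file into pybind target names.

    Hypothetical format: one whitespace-separated permutation per line,
    e.g. "2 1 4" for dim=2, interp=1, nbrRefinedParts=4; blank lines
    and '#' comments are skipped. An empty file yields no targets.
    """
    targets = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()
        if not line:
            continue
        dim, interp, nparts = line.split()
        targets.append(f"cpp_{dim}_{interp}_{nparts}")
    return targets
```

An empty or missing file producing `[]` here is exactly the "no `cpp_*` modules built" scenario the validation guards against.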
lib per sim permutation
lib per sim permutation
for #978
macOS does not work anymore, so it was removed from actions
static samrai libraries are not supported
there are some investigations here into how to support that